Identifying Brain Networks at Multiple Time Scales via Deep Recurrent Neural Network
Authors
Abstract
Related papers
Multiple Time Scales Recurrent Neural Network for Complex Action Acquisition
This paper presents preliminary results of complex action learning based on a multiple time-scales recurrent neural network (MTRNN) model embodied in the iCub humanoid robot. The model was implemented as part of the Aquila cognitive robotics toolkit and accelerated through the compute unified device architecture (CUDA), making use of massively parallel GPU (graphics processing unit) devices that sig...
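A minimal leaky-integrator sketch of the multiple-timescale idea is given below, assuming Python/NumPy. The layer sizes, time constants, and names (tau, W_rec, mtrnn_step) are illustrative assumptions; this is not the Aquila/CUDA implementation used on the iCub, only the general mechanism of fast and slow context units integrating at different rates.

import numpy as np

# Illustrative multiple-timescale recurrent update (not the Aquila/CUDA code):
# "fast" units use a small time constant, "slow" context units a large one,
# so the two groups integrate their inputs at different rates.
rng = np.random.default_rng(0)

n_fast, n_slow, n_in = 20, 10, 5              # assumed layer sizes
tau = np.concatenate([np.full(n_fast, 2.0),   # fast time constants
                      np.full(n_slow, 30.0)]) # slow time constants
n = n_fast + n_slow

W_rec = rng.normal(0, 0.1, (n, n))            # recurrent weights
W_in = rng.normal(0, 0.1, (n, n_in))          # input weights
W_in[n_fast:, :] = 0.0                        # inputs reach fast units only

u = np.zeros(n)                               # internal (pre-activation) state

def mtrnn_step(u, x):
    # One leaky-integrator update: du = (-u + W_rec h + W_in x) / tau
    h = np.tanh(u)
    return u + (-u + W_rec @ h + W_in @ x) / tau

for t in range(100):                          # drive the network with random input
    u = mtrnn_step(u, rng.normal(size=n_in))
print(np.tanh(u)[:5])                         # activations of a few fast units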
Deep Gate Recurrent Neural Network
This paper explores the possibility of using multiplicative gates to build two recurrent neural network structures. These two structures are called Deep Simple Gated Unit (DSGU) and Simple Gated Unit (SGU), both designed for learning long-term dependencies. Compared to traditional Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) cells, both structures require fewer parameters and le...
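As an illustration of multiplicative gating with fewer weight blocks than an LSTM, here is a minimal sketch of a gated recurrent cell in Python/NumPy. It is not the exact SGU or DSGU formulation from the paper; the class name, weight names, and update equations are assumptions made only to show the gating idea.

import numpy as np

# Generic multiplicative-gate recurrent cell (NOT the paper's SGU/DSGU equations):
# one candidate transform and one gate, i.e. two weight blocks instead of the
# four used by a standard LSTM cell.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SimpleGatedCell:
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(n_hidden)
        self.W_c = rng.normal(0, scale, (n_hidden, n_in + n_hidden))  # candidate weights
        self.W_g = rng.normal(0, scale, (n_hidden, n_in + n_hidden))  # gate weights
        self.h = np.zeros(n_hidden)

    def step(self, x):
        z = np.concatenate([x, self.h])
        candidate = np.tanh(self.W_c @ z)        # proposed new content
        gate = sigmoid(self.W_g @ z)             # multiplicative gate in [0, 1]
        self.h = gate * candidate + (1.0 - gate) * self.h
        return self.h

cell = SimpleGatedCell(n_in=4, n_hidden=8)
for t in range(20):
    out = cell.step(np.random.default_rng(t).normal(size=4))
print(out)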
Deep Recurrent Neural Networks for Time Series Prediction
The ability of deep networks to extract high-level features and of recurrent networks to perform time-series inference has been studied. In view of the universality of a one-hidden-layer network at approximating functions under weak constraints, the benefit of multiple layers is to enlarge the space of dynamical systems approximated or, given the space, to reduce the number of units required for a certai...
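A rough Python/NumPy sketch of a stacked (deep) recurrent predictor for one-step-ahead time-series prediction follows. It only shows the forward pass through two recurrent layers with random weights and a linear readout; it is not the architecture or training procedure studied in the cited paper, and all sizes and names are assumptions.

import numpy as np

# Forward pass of a two-layer (stacked) tanh RNN over a toy series, with a
# linear readout of the final hidden state as a next-value prediction.
rng = np.random.default_rng(1)

def rnn_layer(xs, W_in, W_rec):
    # Run one tanh recurrent layer over a sequence, returning all hidden states.
    h = np.zeros(W_rec.shape[0])
    hs = []
    for x in xs:
        h = np.tanh(W_in @ np.atleast_1d(x) + W_rec @ h)
        hs.append(h)
    return np.array(hs)

series = np.sin(np.linspace(0, 8 * np.pi, 200))           # toy input series
h1 = rnn_layer(series, rng.normal(0, 0.5, (16, 1)), rng.normal(0, 0.1, (16, 16)))
h2 = rnn_layer(h1, rng.normal(0, 0.5, (8, 16)), rng.normal(0, 0.1, (8, 8)))
W_out = rng.normal(0, 0.5, (1, 8))
prediction = (W_out @ h2[-1])[0]                          # next-value readout
print(prediction)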
Identifying Deep Contrasting Networks from Time Series Data: Application to Brain Network Analysis
The analysis of multiple time series data, which are generated from a networked system, has attracted much attention recently. This technique has been used in a wide range of applications including functional brain network analysis of neuroimaging data and social influence analysis. In functional brain network analysis, the activity of different brain regions can be represented as multiple time...
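To make the setting concrete, the following Python/NumPy sketch builds a simple functional network from multiple synthetic time series using thresholded Pearson correlation. It illustrates only the general setup of treating each brain region's activity as a time series and deriving edges from pairwise dependence; it is not the contrasting-network method proposed in the cited paper, and the sizes and threshold are arbitrary assumptions.

import numpy as np

# Build a toy functional connectivity network: one time series per region,
# pairwise Pearson correlation, then a threshold to obtain a binary adjacency.
rng = np.random.default_rng(2)

n_regions, n_timepoints = 6, 300
signals = rng.normal(size=(n_regions, n_timepoints))      # synthetic region signals
signals[1] += 0.8 * signals[0]                            # make two regions co-activate

corr = np.corrcoef(signals)                               # region-by-region correlation
adjacency = (np.abs(corr) > 0.3).astype(int)              # threshold into a binary network
np.fill_diagonal(adjacency, 0)                            # no self-loops
print(adjacency)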
Multiple-Weight Recurrent Neural Networks
Recurrent neural networks (RNNs) have enjoyed great success in speech recognition, natural language processing, etc. Many variants of RNNs have been proposed, including vanilla RNNs, LSTMs, and GRUs. However, current architectures are not particularly adept at dealing with tasks involving multi-faceted contents, i.e., data with a bimodal or multimodal distribution. In this work, we solve this p...
Journal
Journal title: IEEE Journal of Biomedical and Health Informatics
Year: 2019
ISSN: 2168-2194,2168-2208
DOI: 10.1109/jbhi.2018.2882885